--- title: Calibration Procedure keywords: fastai sidebar: home_sidebar summary: "Every OpenHSI camera is unique and requires calibration before use. This module provides the abstractions to create the calibration data which are then used in operation. " description: "Every OpenHSI camera is unique and requires calibration before use. This module provides the abstractions to create the calibration data which are then used in operation. " nb_path: "nbs/05_calibrate.ipynb" ---
{% raw %}
{% endraw %}

{% include tip.html content='This module can be imported using from openhsi.calibrate import *' %}

{% raw %}
{% endraw %} {% raw %}
{% endraw %} {% raw %}

sum_gaussians[source]

sum_gaussians(x:np.array, *args)

`x` gives the indices to evaluate at; `args` holds the (amplitude, peak position, peak width) parameters for each Gaussian, followed by a constant offset.

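The model can be sketched as a sum of Gaussians plus a constant offset. Below is a minimal re-implementation assuming the `*args` layout described above; it is an illustration, not the library's exact code:

```python
import numpy as np

def sum_gaussians(x, *args):
    """Evaluate a sum of Gaussians plus a constant offset.

    args layout: (amp0, pos0, width0, amp1, pos1, width1, ..., constant)
    """
    *params, constant = args
    y = np.full_like(x, constant, dtype=float)
    for amp, pos, width in zip(params[0::3], params[1::3], params[2::3]):
        y += amp * np.exp(-((x - pos) ** 2) / (2 * width ** 2))
    return y
```

A model with this signature can be passed directly to `scipy.optimize.curve_fit`, which is why the per-Gaussian parameters are flattened into `*args`.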
{% endraw %} {% raw %}
{% endraw %} {% raw %}

class SettingsBuilderMixin[source]

SettingsBuilderMixin()

{% endraw %} {% raw %}
{% endraw %} {% raw %}

class SettingsBuilderMetaclass[source]

SettingsBuilderMetaclass(clsname:str, cam_class, attrs) :: type

type(object_or_name, bases, dict)
type(object) -> the object's type
type(name, bases, dict) -> a new type
{% endraw %} {% raw %}

create_settings_builder[source]

create_settings_builder(clsname:str, cam_class:Camera Class)

Create a `SettingsBuilder` class called `clsname` based on your chosen `cam_class`.
{% endraw %} {% raw %}
{% endraw %}

Using the SettingsBuilderMixin

There are a few ways to create a SettingsBuilder class that works for your custom camera; they involve Python metaclasses and mixins.

For example, you can create one using any of the following approaches. {% include note.html content='Below we use the SimulatedCamera class. When calibrating a real camera, replace SimulatedCamera with your camera class.' %}

{% raw %}
SettingsBuilder = create_settings_builder("SettingsBuilder",SimulatedCamera)

# using Metaclasses
SettingsBuilder = SettingsBuilderMetaclass("SettingsBuilder",SimulatedCamera,{})

# initialising
sb = SettingsBuilder(json_path="assets/cam_settings.json", 
                     pkl_path="assets/cam_calibration.pkl")
Allocated 69.93 MB of RAM. There was 5621.33 MB available.
{% endraw %} {% raw %}
class CalibrateOpenHSI(SettingsBuilderMixin, SimulatedCamera):
    pass

sb = CalibrateOpenHSI(mode="flat",json_path="assets/cam_settings.json", pkl_path="assets/cam_calibration.pkl")
Allocated 69.93 MB of RAM. There was 5581.38 MB available.
{% endraw %}

Find illuminated sensor area

We assume the x axis (the detector columns) holds the spectral channels, while the rows correspond to the cross-track dimension and are limited by the optics (slit). The usable area is cropped out (windowing can also be used to reduce data intake).
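A rough sketch of how such edge detection could work: threshold the row-averaged flat-field profile and pad the bright region by `edgezone` pixels. This is a hypothetical helper under those assumptions, not the library's implementation:

```python
import numpy as np

def find_illuminated_rows(flat_field, edgezone=4, frac=0.25):
    # mean intensity per row gives the cross-track illumination profile
    profile = flat_field.mean(axis=1)
    # rows brighter than a fraction of the peak are considered illuminated
    bright = np.where(profile > frac * profile.max())[0]
    # pad by edgezone pixels, clamped to the sensor bounds
    row_min = max(bright[0] - edgezone, 0)
    row_max = min(bright[-1] + edgezone, flat_field.shape[0] - 1)
    return row_min, row_max
```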

{% raw %}

SettingsBuilderMixin.retake_flat_field[source]

SettingsBuilderMixin.retake_flat_field(show:bool=True)

Take and store an image with the OpenHSI slit illuminated by a uniform light source.
Type Default Details
show bool True flag to show taken image

SettingsBuilderMixin.update_row_minmax[source]

SettingsBuilderMixin.update_row_minmax(edgezone:int=4, show:bool=True)

Find edges of slit in flat field images and determine region to crop.
Type Default Details
edgezone int 4 number of pixel buffer to add to crop region
show bool True flag to show plot of slice and edges identified
{% endraw %} {% raw %}
hvimg=sb.retake_flat_field(show=True)
hvimg.opts(width=400,height=400)
print(sb.calibration["flat_field_pic"].max())
hvimg
255
{% endraw %} {% raw %}
sb.update_row_minmax()
Locs row_min: 7 and row_max: 912
{% endraw %} {% raw %}
sb.update_resolution()
{% endraw %}

Smile Correction

The emission lines, which should be straight and vertical, appear slightly curved. This is smile error (distortion in the spectral dimension).
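Once per-row shifts are known, the correction can be sketched as shifting each cross-track row horizontally so the emission lines line up. This is a simplified integer-pixel version; the hypothetical `shifts` array stands in for the output of `update_smile_shifts`:

```python
import numpy as np

def apply_smile_shifts(img, shifts):
    # shift each cross-track row left by its smile offset so the
    # emission lines become straight columns
    out = np.empty_like(img)
    for i, s in enumerate(shifts):
        out[i] = np.roll(img[i], -int(s))
    return out
```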

{% raw %}
sb.mode_change("HgAr")
{% endraw %} {% raw %}

SettingsBuilderMixin.retake_emission_lines[source]

SettingsBuilderMixin.retake_emission_lines(show:bool=True, nframes:int=10)

Take and store an image with camera illuminated with a calibration source.
Type Default Details
show bool True flag to show
nframes int 10 number of frames to average for image

SettingsBuilderMixin.retake_HgAr[source]

SettingsBuilderMixin.retake_HgAr(show:bool=True, nframes:int=10)

Take and store an image with OpenHSI camera illuminated with a HgAr calibration source.
Type Default Details
show bool True flag to show
nframes int 10 number of frames to average for image

SettingsBuilderMixin.update_smile_shifts[source]

SettingsBuilderMixin.update_smile_shifts(show=True)

Determine the smile, and the shifts needed to correct it, from the spectral lines image.
Type Default Details
show bool True flag to show plot of smile shifts for each cross track pixel.
{% endraw %} {% raw %}
hvimg=sb.retake_HgAr(show=True, nframes=1)
hvimg.opts(width=400,height=400)
print(sb.calibration["HgAr_pic"].max())
hvimg
255.0
{% endraw %} {% raw %}
sb.update_smile_shifts()
{% endraw %}

Map the spectral axis to wavelengths

To do this, peaks in the HgAr spectrum are found and refined by curve-fitting with Gaussians. The locations of the peaks then allow interpolation from array (column) index to wavelength (nm).
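The interpolation step can be sketched as follows, using hypothetical matched peak positions (the column values below are made up; the wavelengths are the HgAr lines used elsewhere in this page):

```python
import numpy as np
from scipy.interpolate import interp1d

# hypothetical fitted peak columns and their known HgAr wavelengths (nm)
peak_cols = np.array([100.2, 310.7, 655.4])
peak_wavelengths = np.array([435.833, 546.074, 763.511])

# cubic interpolation needs at least 4 matched peaks;
# fall back to a lower spline order otherwise
kind = "cubic" if len(peak_cols) >= 4 else len(peak_cols) - 1
col2wav = interp1d(peak_cols, peak_wavelengths, kind=kind, fill_value="extrapolate")

# map every detector column index to a wavelength
wavelengths = col2wav(np.arange(772))
```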

{% raw %}

SettingsBuilderMixin.fit_emission_lines[source]

SettingsBuilderMixin.fit_emission_lines(brightest_peaks:list, emission_lines:list, top_k:int=10, filter_window:int=1, interactive_peak_id:bool=False, find_peaks_height:int=10, prominence:float=0.2, width:float=1.5, distance:int=10, max_match_error:float=2.0, verbose:bool=False)

Finds the index to wavelength map given a spectra and a list of emission lines.
To filter the spectra, set `filter_window` to an odd number > 1.
Type Default Details
brightest_peaks list list of wavelengths for the brightest peaks in the spectral lines image
emission_lines list list of emission lines to match
top_k int 10 how many peaks to use in fit
filter_window int 1 filter window for scipy.signal.savgol_filter. Needs to be odd.
interactive_peak_id bool False flag to interactively confirm wavelength of peaks
find_peaks_height int 10 anything above this value is free game for a peak
prominence float 0.2 prominence for scipy.signal.find_peaks
width float 1.5 peak width for scipy.signal.find_peaks
distance int 10 distance for scipy.signal.find_peaks
max_match_error float 2.0 max diff between peak estimate wavelength and wavelength from line list
verbose bool False more detailed diagnostic messages

SettingsBuilderMixin.fit_HgAr_lines[source]

SettingsBuilderMixin.fit_HgAr_lines(brightest_peaks:list=[435.833, 546.074, 763.511], top_k:int=10, filter_window:int=1, interactive_peak_id:bool=False, find_peaks_height:int=10, prominence:float=0.2, width:float=1.5, distance:int=10, max_match_error:float=2.0, verbose:bool=False)

Finds the index to wavelength map given a spectra and a list of emission lines.
To filter the spectra, set `filter_window` to an odd number > 1.
Type Default Details
brightest_peaks list (435.833, 546.074, 763.511) list of wavelengths for the brightest peaks in the spectral lines image
top_k int 10 how many peaks to use in fit
filter_window int 1 filter window for scipy.signal.savgol_filter. Needs to be odd.
interactive_peak_id bool False flag to interactively confirm wavelength of peaks
find_peaks_height int 10 anything above this value is free game for a peak
prominence float 0.2 prominence for scipy.signal.find_peaks
width float 1.5 peak width for scipy.signal.find_peaks
distance int 10 distance for scipy.signal.find_peaks
max_match_error float 2.0 max diff between peak estimate wavelength and wavelength from line list
verbose bool False more detailed diagnostic messages
{% endraw %} {% raw %}
sb.fit_HgAr_lines(top_k=10)
{% endraw %}

Each column in our camera frame (after smile correction) corresponds to a particular wavelength. The map between column index and wavelength is slightly nonlinear, which is to be expected from the diffraction grating, but it is linear to a good approximation. A linear interpolation gives an absolute error of $\pm$3 nm, whereas the cubic interpolation used here gives an absolute error of $\pm$0.3 nm (approximately the wavelength spacing between columns). Higher-order polynomials don't improve the error due to overfitting.

For fast real-time processing, the fast binning procedure assumes a linear interpolation, because the binning algorithm is then a single broadcasted summation with no additional memory allocation overhead. A slower, more accurate spectral binning procedure using the cubic interpolation described here is also provided, but it requires hundreds of temporary arrays to be allocated each time. Binning can also be done in post-processing after collecting raw data.
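The fast linear-binning idea can be sketched as a single reshape-and-sum over equal-width column groups (an illustrative simplification, not the library's binning code):

```python
import numpy as np

def fast_bin(img, bin_factor=4):
    # sum groups of bin_factor adjacent spectral columns in one
    # vectorised operation; trailing columns that don't fill a
    # whole bin are dropped
    h, w = img.shape
    w_trim = w - (w % bin_factor)
    return img[:, :w_trim].reshape(h, w_trim // bin_factor, bin_factor).sum(axis=2)
```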

{% raw %}

SettingsBuilderMixin.update_intsphere_fit[source]

SettingsBuilderMixin.update_intsphere_fit(spec_rad_ref_data:str='assets/112704-1-1_1nm_data.csv', spec_rad_ref_luminance:int=52020, show:bool=True)

Type Default Details
spec_rad_ref_data str assets/112704-1-1_1nm_data.csv path to integrating sphere cal file
spec_rad_ref_luminance int 52020 reference luminance for integrating sphere
show bool True flag to show plot
{% endraw %} {% raw %}
fig = sb.update_intsphere_fit()
#sb.dump() # resave the settings and calibration files
{% endraw %}

Integrating Sphere data

4D datacube with coordinates of cross-track, wavelength, exposure, and luminance. {% include warning.html content='Needs testing!' %}
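The collection loop can be sketched in plain NumPy as a nested sweep over exposures and luminances (hypothetical shapes and `get_frame` callback; the real method also sets the sphere via `lum_chg_func` between luminance settings):

```python
import numpy as np

def collect_cube(get_frame, exposures, luminances, nframes=10):
    # get_frame(exposure) -> 2D (cross_track, wavelength) array
    sample = get_frame(exposures[0])
    cube = np.zeros((len(exposures), len(luminances)) + sample.shape)
    for i, exp in enumerate(exposures):
        for j, lum in enumerate(luminances):
            # in the real procedure, the sphere is set to `lum` here
            frames = [get_frame(exp) for _ in range(nframes)]
            cube[i, j] = np.mean(frames, axis=0)
    return cube
```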

{% raw %}

SettingsBuilderMixin.update_intsphere_cube[source]

SettingsBuilderMixin.update_intsphere_cube(exposures:List[~T], luminances:List[~T], nframes:int=10, lum_chg_func:typing.Callable=print, interactive:bool=False)

Type Default Details
exposures typing.List exposure times for the camera to iterate over
luminances typing.List luminance values for the integrating sphere to iterate over
nframes int 10 how many frames to average over
lum_chg_func typing.Callable print called on each luminance value before collection starts
interactive bool False if you want to manually press enter each luminance iteration
{% endraw %} {% raw %}
luminances = [0, 1_000, 5_000, 10_000, 20_000, 40_000]
exposures = [0, 5, 8, 10, 15, 20]
sb.calibration["rad_ref"] = sb.update_intsphere_cube(exposures, luminances, nframes=50, lum_chg_func=spt.selectPreset)

# remove saturated images
sb.calibration["rad_ref"] = sb.calibration["rad_ref"].where(
    ~(np.sum(sb.calibration["rad_ref"] == 255, axis=(1, 2)) > 1000)
)
{% endraw %}

When you are happy with the calibration, dump the updates.

{% raw %}
#sb.dump(json_path=json_path_target, pkl_path=pkl_path_target)
{% endraw %}

SpectraPT TCP Client

Class to interact with the Spectra PT integrating sphere.
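The underlying transport is a simple one-shot TCP exchange, which can be sketched as follows (a hypothetical standalone helper, assuming the host/port defaults shown in the constructor; the real class wraps this in `client` and the preset methods):

```python
import socket

def send_command(msg, host="localhost", port=3434):
    # open a connection, send one command string, return the reply
    with socket.create_connection((host, port), timeout=5) as sock:
        sock.sendall(msg.encode())
        return sock.recv(1024).decode()
```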

{% raw %}

class SpectraPTController[source]

SpectraPTController(lum_preset_dict:Dict[int, int]={0: 1, 1000: 2, 2000: 3, 3000: 4, 4000: 5, 5000: 6, 6000: 7, 7000: 8, 8000: 9, 9000: 10, 10000: 11, 20000: 12, 25000: 13, 30000: 14, 35000: 15, 40000: 16}, host:str='localhost', port:int=3434)

{% endraw %} {% raw %}
{% endraw %} {% raw %}

SpectraPTController.client[source]

SpectraPTController.client(msg:str)

SpectraPTController.selectPreset[source]

SpectraPTController.selectPreset(lumtarget:float)

SpectraPTController.turnOnLamp[source]

SpectraPTController.turnOnLamp()

SpectraPTController.turnOffLamp[source]

SpectraPTController.turnOffLamp()

{% endraw %}